@nlpjs/lang-en
You can install @nlpjs/lang-en:
npm install @nlpjs/lang-en
Normalization converts a text to lowercase and removes character decorations (such as accents).
const { NormalizerEn } = require('@nlpjs/lang-en');
const normalizer = new NormalizerEn();
const input = 'This shóuld be normalized';
const result = normalizer.normalize(input);
console.log(result);
// output: this should be normalized
Tokenization splits a sentence into words.
const { TokenizerEn } = require('@nlpjs/lang-en');
const tokenizer = new TokenizerEn();
const input = 'This isn\'t tokenized yet';
const result = tokenizer.tokenize(input);
console.log(result);
// output: [ 'This', 'is', 'not', 'tokenized', 'yet' ]
The tokenizer can also normalize the sentence before tokenizing; to do that, pass true as the second argument to the tokenize method:
const { TokenizerEn } = require('@nlpjs/lang-en');
const tokenizer = new TokenizerEn();
const input = 'This isn\'t tokenized yet';
const result = tokenizer.tokenize(input, true);
console.log(result);
// output: [ 'this', 'is', 'not', 'tokenized', 'yet' ]
Using the class StopwordsEn you can identify whether a word is a stopword:
const { StopwordsEn } = require('@nlpjs/lang-en');
const stopwords = new StopwordsEn();
console.log(stopwords.isStopword('is'));
// output: true
console.log(stopwords.isStopword('developer'));
// output: false
Using the class StopwordsEn you can remove stopwords from an array of words:
const { StopwordsEn } = require('@nlpjs/lang-en');
const stopwords = new StopwordsEn();
console.log(stopwords.removeStopwords(['who', 'is', 'your', 'develop']));
// output: ['develop']
Using the class StopwordsEn you can reset its dictionary and build it from another set of words:
const { StopwordsEn } = require('@nlpjs/lang-en');
const stopwords = new StopwordsEn();
stopwords.dictionary = {};
stopwords.build(['is', 'your']);
console.log(stopwords.removeStopwords(['who', 'is', 'your', 'develop']));
// output: ['who', 'develop']
A stemmer is an algorithm that calculates the stem (root) of a word by removing its affixes.
You can stem a single word using the stemWord method:
const { StemmerEn } = require('@nlpjs/lang-en');
const stemmer = new StemmerEn();
const input = 'developer';
console.log(stemmer.stemWord(input));
// output: develop
You can stem an array of words using the stem method:
const { StemmerEn } = require('@nlpjs/lang-en');
const stemmer = new StemmerEn();
const input = ['Who', 'is', 'your', 'developer'];
console.log(stemmer.stem(input));
// output: [ 'Who', 'is', 'your', 'develop' ]
As you can see, the stemmer does not normalize internally, so uppercase words stay uppercase. It also only recognizes lowercase affixes, so developer is stemmed to develop while DEVELOPER is left unchanged.
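A quick sketch of this behavior, reusing the StemmerEn API shown above (the expected output follows from the rules described in the previous paragraph):
const { StemmerEn } = require('@nlpjs/lang-en');
const stemmer = new StemmerEn();
// developer matches the lowercase affix rules, DEVELOPER does not
console.log(stemmer.stem(['DEVELOPER', 'developer']));
// expected output: [ 'DEVELOPER', 'develop' ]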
You can tokenize and stem a sentence, including normalization, with the tokenizeAndStem method:
const { StemmerEn } = require('@nlpjs/lang-en');
const stemmer = new StemmerEn();
const input = 'Who is your DEVELOPER';
console.log(stemmer.tokenizeAndStem(input));
// output: [ 'who', 'is', 'your', 'develop' ]
When calling the tokenizeAndStem method of the StemmerEn class, the second parameter is a boolean indicating whether the stemmer should keep the stopwords (true) or remove them (false). Before using it, a stopwords instance must be set on the stemmer:
const { StemmerEn, StopwordsEn } = require('@nlpjs/lang-en');
const stemmer = new StemmerEn();
stemmer.stopwords = new StopwordsEn();
const input = 'who is your developer';
console.log(stemmer.tokenizeAndStem(input, false));
// output: ['develop']
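Conversely, passing true as the second argument should keep the stopwords in the result (a sketch based on the behavior described above):
const { StemmerEn, StopwordsEn } = require('@nlpjs/lang-en');
const stemmer = new StemmerEn();
stemmer.stopwords = new StopwordsEn();
const input = 'who is your developer';
// true keeps the stopwords instead of removing them
console.log(stemmer.tokenizeAndStem(input, true));
// expected output: [ 'who', 'is', 'your', 'develop' ]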
To use sentiment analysis you'll need to create a new Container and register the LangEn plugin, because internally the SentimentAnalyzer class tries to retrieve the normalizer, tokenizer, stemmer and sentiment dictionaries from the container.
const { Container } = require('@nlpjs/core');
const { SentimentAnalyzer } = require('@nlpjs/sentiment');
const { LangEn } = require('@nlpjs/lang-en');
(async () => {
const container = new Container();
container.use(LangEn);
const sentiment = new SentimentAnalyzer({ container });
const result = await sentiment.process({ locale: 'en', text: 'I love cats'});
console.log(result.sentiment);
})();
// output:
// {
// score: 0.5,
// numWords: 3,
// numHits: 1,
// average: 0.16666666666666666,
// type: 'senticon',
// locale: 'en',
// vote: 'positive'
// }
The output of the sentiment analysis includes the total score, the number of words analyzed (numWords), the number of words found in the sentiment dictionary (numHits), the average score per word, the dictionary type used (here senticon), the locale and the overall vote (positive, negative or neutral).
Below is a complete example of training and querying an NLP pipeline that uses the English language plugin:
const { containerBootstrap } = require('@nlpjs/core');
const { Nlp } = require('@nlpjs/nlp');
const { LangEn } = require('@nlpjs/lang-en');
(async () => {
const container = await containerBootstrap();
container.use(Nlp);
container.use(LangEn);
const nlp = container.get('nlp');
nlp.settings.autoSave = false;
nlp.addLanguage('en');
// Adds the utterances and intents for the NLP
nlp.addDocument('en', 'goodbye for now', 'greetings.bye');
nlp.addDocument('en', 'bye bye take care', 'greetings.bye');
nlp.addDocument('en', 'okay see you later', 'greetings.bye');
nlp.addDocument('en', 'bye for now', 'greetings.bye');
nlp.addDocument('en', 'i must go', 'greetings.bye');
nlp.addDocument('en', 'hello', 'greetings.hello');
nlp.addDocument('en', 'hi', 'greetings.hello');
nlp.addDocument('en', 'howdy', 'greetings.hello');
// Train also the NLG
nlp.addAnswer('en', 'greetings.bye', 'Till next time');
nlp.addAnswer('en', 'greetings.bye', 'see you soon!');
nlp.addAnswer('en', 'greetings.hello', 'Hey there!');
nlp.addAnswer('en', 'greetings.hello', 'Greetings!');
await nlp.train();
const response = await nlp.process('en', 'I should go now');
console.log(response);
})();
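The logged response is an object describing the processing result; for the input above it should resolve to the greetings.bye intent and include, among other fields, the classification score and one of the answers registered for that intent (for example 'Till next time' or 'see you soon!').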
You can read the guide on how to contribute at Contributing.
You can read the Code of Conduct at Code of Conduct.
This project is developed by AXA Group Operations Spain S.A.
If you need to contact us, you can do so at opensource@axa.com.
Copyright (c) AXA Group Operations Spain S.A.
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.